Stochastic zeroth-order discretizations of Langevin diffusions for Bayesian inference


Abstract

Discretizations of Langevin diffusions provide a powerful method for sampling and Bayesian inference. However, such discretizations require evaluating the gradient of the potential function. In several real-world scenarios, obtaining gradient evaluations might be either computationally expensive or simply impossible. In this work, we propose and analyze stochastic zeroth-order algorithms for discretizing overdamped and underdamped Langevin diffusions. Our approach is based on estimating gradients via Gaussian Stein's identities, which are widely used in the zeroth-order optimization literature. We provide a comprehensive oracle complexity analysis – the number of noisy function evaluations that must be made to obtain an ϵ-approximate sample in Wasserstein distance – for both diffusions, under various noise models. Our theoretical contributions extend the applicability of Langevin sampling to black-box settings arising in practice.
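The idea in the abstract can be sketched in a few lines of code. The following is a minimal illustration, not the paper's exact algorithm or tuning: the gradient of the potential is replaced by a Gaussian-smoothing estimate based on Stein's identity, and that estimate is plugged into an Euler–Maruyama discretization of the overdamped Langevin diffusion. All function names and parameter values here are illustrative choices.

```python
import numpy as np

def zo_gradient(f, x, sigma=1e-3, m=10, rng=None):
    """Zeroth-order gradient estimate via Gaussian smoothing.

    Stein's identity gives E[(f(x + sigma*u) / sigma) * u] = grad of the
    smoothed potential for u ~ N(0, I); the forward-difference form below
    is the common lower-variance variant, averaged over m directions."""
    rng = np.random.default_rng() if rng is None else rng
    d = x.shape[0]
    fx = f(x)  # reuse the base evaluation across all m directions
    g = np.zeros(d)
    for _ in range(m):
        u = rng.standard_normal(d)
        g += (f(x + sigma * u) - fx) / sigma * u
    return g / m

def zo_langevin(f, x0, step=1e-2, n_iters=1000, rng=None):
    """Euler-Maruyama discretization of overdamped Langevin dynamics,
    with the true gradient of the potential f replaced by the
    zeroth-order estimate above (only noisy function values are used)."""
    rng = np.random.default_rng(0) if rng is None else rng
    x = x0.copy()
    samples = []
    for _ in range(n_iters):
        g = zo_gradient(f, x, rng=rng)
        x = x - step * g + np.sqrt(2 * step) * rng.standard_normal(x.shape[0])
        samples.append(x.copy())
    return np.array(samples)
```

For the quadratic potential f(x) = ||x||²/2 the target is a standard Gaussian, so the chain's empirical mean and variance should settle near 0 and 1 after burn-in.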



Related papers

Stochastic First- and Zeroth-order Methods for Nonconvex Stochastic Programming

In this paper, we introduce a new stochastic approximation (SA) type algorithm, namely the randomized stochastic gradient (RSG) method, for solving an important class of nonlinear (possibly nonconvex) stochastic programming (SP) problems. We establish the complexity of this method for computing an approximate stationary point of a nonlinear programming problem. We also show that this method pos...


Bayesian sequential inference for nonlinear multivariate diffusions

In this paper, we adapt recently developed simulation-based sequential algorithms to the problem concerning the Bayesian analysis of discretely observed diffusion processes. The estimation framework involves the introduction of m−1 latent data points between every pair of observations. Sequential MCMC methods are then used to sample the posterior distribution of the latent data and the model pa...


Variational Bayesian Inference for Partially Observed Diffusions

In this paper the variational Bayesian approximation for partially observed continuous time stochastic processes is studied. We derive an EM-like algorithm and give its implementations. The variational Expectation step is explicitly solved using the method of conditional moment generating functions and stochastic partial differential equations. The numerical experiments demonstrate that the var...


Stochastic Zeroth-order Optimization in High Dimensions

We consider the problem of optimizing a high-dimensional convex function using stochastic zeroth-order queries. Under sparsity assumptions on the gradients or function values, we present two algorithms: a successive component/feature selection algorithm and a noisy mirror descent algorithm using Lasso gradient estimates, and show that both algorithms have convergence rates that depend only loga...


Bayesian Learning via Stochastic Gradient Langevin Dynamics

In this paper we propose a new framework for learning from large scale datasets based on iterative learning from small mini-batches. By adding the right amount of noise to a standard stochastic gradient optimization algorithm we show that the iterates will converge to samples from the true posterior distribution as we anneal the stepsize. This seamless transition between optimization and Bayesi...
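The update described in this last summary can be illustrated with a short sketch. This is a toy version under stated assumptions, not the authors' full recipe: each step uses a minibatch gradient of the log-posterior, rescaled by N/n so it is an unbiased estimate of the full-data gradient, plus injected Gaussian noise whose variance equals the stepsize.

```python
import numpy as np

def sgld_step(theta, grad_log_prior, grad_log_lik, batch, N, eps, rng):
    """One stochastic gradient Langevin dynamics (SGLD) update.

    The minibatch log-likelihood gradient is rescaled by N/n (full data
    size over batch size) to remain unbiased; the injected noise has
    variance eps, matching the Langevin discretization, so as eps is
    annealed the iterates transition from optimization to posterior
    sampling."""
    n = len(batch)
    g = grad_log_prior(theta) + (N / n) * sum(grad_log_lik(theta, y) for y in batch)
    return theta + 0.5 * eps * g + np.sqrt(eps) * rng.standard_normal(np.shape(theta))
```

On a conjugate Gaussian model (y_i ~ N(θ, 1) with a N(0, 1) prior), the posterior mean is Σy_i/(N+1), and the SGLD chain with a small fixed stepsize concentrates around it.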



Journal

Journal title: Bernoulli

Year: 2022

ISSN: 1573-9759, 1350-7265

DOI: https://doi.org/10.3150/21-bej1400